1,578 research outputs found

    A Compilation Target for Probabilistic Programming Languages

    Forward inference techniques such as sequential Monte Carlo and particle Markov chain Monte Carlo for probabilistic programming can be implemented in any programming language by creative use of standardized operating system functionality, including processes, forking, mutexes, and shared memory. Exploiting this, we have defined, developed, and tested an intermediate representation language for probabilistic programming, which we call Probabilistic C. It can itself be compiled to machine code by standard compilers and linked to operating system libraries, yielding an efficient, scalable, and portable probabilistic programming compilation target. This opens up a new hardware and systems research path for optimizing probabilistic programming systems.
    Comment: In Proceedings of the 31st International Conference on Machine Learning (ICML), 2014
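    As a rough illustration of the forking idea described above, the Python sketch below uses os.fork() so that each child process continues as an independent particle whose weight is returned to the parent over a pipe. The toy model, function names, and weighting scheme are illustrative assumptions, not the Probabilistic C implementation.

```python
# Minimal sketch (POSIX only): each forked child is an independent copy of the
# running program, i.e. one particle; the parent collects likelihood weights
# over a pipe and normalises them. Toy model and names are hypothetical.
import math
import os
import random
import struct

def observe_weight(x, y=2.5, noise=1.0):
    # Unnormalised Gaussian likelihood of observation y given latent x.
    return math.exp(-0.5 * ((y - x) / noise) ** 2)

def run_particles(n_particles=4):
    read_fd, write_fd = os.pipe()
    children = []
    for _ in range(n_particles):
        pid = os.fork()
        if pid == 0:                      # child: this process *is* the particle
            os.close(read_fd)
            random.seed(os.getpid())      # children inherit identical PRNG state
            x = random.gauss(0.0, 1.0)    # "sample" statement
            w = observe_weight(x)         # "observe" statement yields a weight
            os.write(write_fd, struct.pack("d", w))
            os._exit(0)
        children.append(pid)
    os.close(write_fd)
    weights = [struct.unpack("d", os.read(read_fd, 8))[0] for _ in children]
    for pid in children:
        os.waitpid(pid, 0)
    total = sum(weights)
    return [w / total for w in weights]   # normalised importance weights

if __name__ == "__main__":
    print(run_particles())
```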

    Does Your Personality Trait Affect Behavior on Social Media?

    The purpose of this research was to determine whether personality traits affect behavior on social media. An online survey was created to assess personality type and specific behaviors on social media, and was distributed online. From the data I collected, I found that extroverts use social media more than introverts, while introverts use game, music, and entertainment apps more than extroverts do. Extroverts also comment more than introverts. Extroverts were heavily concentrated in the "frequently post" and "rarely post" categories, whereas introverts were clustered in the "very rarely post" and "rarely post" categories. Extroverts also spend more time on the apps they enjoy than introverts do. Introverts tend to talk to people who are familiar to them, whereas extroverts talked to people they did not know. More extroverts believed that they communicate the same way on social media as in person. Finally, extroverts and introverts took roughly the same stance on whether social media could have affected their personality, with most saying that it could have.

    How Therapy Affects the Counselor: Development through Play Therapy Practice and Supervision

    Therapeutic relationships, and counselor qualities as contributions to those relationships, are widely recognized as critical to counseling outcomes (Norcross, 2011). Counselors in training (CITs) tend to possess certain traits at certain stages, such as high anxiety, lack of confidence, and a strong focus on self in an early stage of development. Child-centered play therapy (CCPT) represents a specialization within counseling, and the current research highlights how CITs learn CCPT within the classroom (Fall, Drew, Chute, & More, 2007; Homeyer & Rae, 1998; Kao & Landreth, 1997; Lindo et al., 2012; Ray, 2004; Ritter & Chang, 2002; Tanner & Mathis, 1995) but not within training experiences. Thus, the purpose of this study is to explore how counselors develop during an early training experience in CCPT, utilizing a case study (Stake, 1995; Yin, 1994, 2003) of an existing supervision group. I utilized the Integrative Developmental Model (IDM) as a theoretical lens to better understand the participants, and I used the constant comparative method (Lincoln & Guba, 1985; Merriam, 1998) to analyze the online blogs, semi-structured interviews, and focus group. Several themes emerged, including counselor development, empathy-shared experience, performance anxiety, confidence development, buy-in, skill development, greater understanding of theory, greater understanding of self, and valuing supervision and the supervisory relationship. I discuss the findings in detail in relation to the current research on counselor development and within the context of the IDM. I also provide implications for counselor educators and supervisors, as well as outline ideas for future research.

    Money Market Funds: Analyzing Reform

    This thesis analyzes money market funds from their inception in 1983, through the 2008 financial crisis, to the present day. Specifically, it discusses the newly implemented reform and whether it will be an effective way to minimize the systemic risk of money market funds.

    Grammar Variational Autoencoder

    Deep generative models have been wildly successful at learning coherent latent representations for continuous data such as video and audio. However, generative modeling of discrete data such as arithmetic expressions and molecular structures still poses significant challenges. Crucially, state-of-the-art methods often produce outputs that are not valid. We make the key observation that frequently, discrete data can be represented as a parse tree from a context-free grammar. We propose a variational autoencoder which encodes and decodes directly to and from these parse trees, ensuring the generated outputs are always valid. Surprisingly, we show that not only does our model more often generate valid outputs, it also learns a more coherent latent space in which nearby points decode to similar discrete outputs. We demonstrate the effectiveness of our learned models by showing their improved performance in Bayesian optimization for symbolic regression and molecular synthesis.
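    As a rough illustration of the representation the abstract describes (discrete data as parse trees of a context-free grammar), the Python sketch below converts a small arithmetic expression into the sequence of production rules that derives it and one-hot encodes that sequence. The toy grammar, parser, and encoding are simplified assumptions, not the paper's actual grammar or model.

```python
# Simplified sketch: turn an arithmetic expression into the left-most derivation
# of a tiny context-free grammar, then one-hot encode the rule sequence so it
# can be fed to a sequence encoder. Grammar and encoding are illustrative only.
import numpy as np

# Productions of a toy grammar over single-character tokens.
PRODUCTIONS = [
    ("S", ["S", "+", "T"]),   # 0
    ("S", ["T"]),             # 1
    ("T", ["x"]),             # 2
    ("T", ["1"]),             # 3
]

def parse(expr):
    """Return the left-most derivation of `expr` as a list of production indices."""
    if "+" in expr:
        # Split on the last '+' so the left operand stays an S and the right a T.
        left, right = expr.rsplit("+", 1)
        return [0] + parse(left) + parse_terminal(right)
    return [1] + parse_terminal(expr)

def parse_terminal(tok):
    return [2] if tok == "x" else [3]

def one_hot(rule_sequence, max_len=8):
    enc = np.zeros((max_len, len(PRODUCTIONS)), dtype=np.float32)
    for t, r in enumerate(rule_sequence):
        enc[t, r] = 1.0
    return enc

rules = parse("x+1+x")       # -> [0, 0, 1, 2, 3, 2]
print(rules)
print(one_hot(rules).shape)  # (8, 4): fixed-size array for the encoder
```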

    Fast and Scalable Spike and Slab Variable Selection in High-Dimensional Gaussian Processes

    Variable selection in Gaussian processes (GPs) is typically undertaken by thresholding the inverse lengthscales of automatic relevance determination kernels, but in high-dimensional datasets this approach can be unreliable. A more probabilistically principled alternative is to use spike and slab priors and infer a posterior probability of variable inclusion. However, existing implementations in GPs are very costly to run on both high-dimensional and large-n datasets, or are only suitable for unsupervised settings with specific kernels. As such, we develop a fast and scalable variational inference algorithm for the spike and slab GP that is tractable with arbitrary differentiable kernels. We improve our algorithm’s ability to adapt to the sparsity of relevant variables by Bayesian model averaging over hyperparameters, and achieve substantial speed-ups using zero-temperature posterior restrictions, dropout pruning, and nearest-neighbour minibatching. In experiments, our method consistently outperforms vanilla and sparse variational GPs whilst retaining similar runtimes (even when n = 10^6), and performs competitively with a spike and slab GP using MCMC but runs up to 1000 times faster.
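    To make the model structure concrete, the numpy sketch below shows a generic spike-and-slab ARD construction: each input dimension gets a binary inclusion indicator with a Bernoulli prior, and the ARD squared-exponential kernel only sees the included dimensions. The prior settings and kernel form are generic assumptions for illustration, not the paper's exact model or its variational inference algorithm.

```python
# Generic spike-and-slab ARD kernel sketch (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

def ard_rbf(X1, X2, lengthscales, inclusion):
    """ARD squared-exponential kernel with a spike-and-slab gate per dimension."""
    scale = inclusion / lengthscales          # excluded dimensions contribute nothing
    diff = (X1[:, None, :] - X2[None, :, :]) * scale
    return np.exp(-0.5 * np.sum(diff ** 2, axis=-1))

D = 10
pi = 0.2                                      # prior inclusion probability
inclusion = rng.binomial(1, pi, size=D)       # spike-and-slab indicators s_d
lengthscales = rng.gamma(2.0, 1.0, size=D)    # "slab" lengthscales for included dims

X = rng.normal(size=(5, D))
K = ard_rbf(X, X, lengthscales, inclusion)
print("included dimensions:", np.flatnonzero(inclusion))
print("kernel matrix shape:", K.shape)

# Variable selection then thresholds the posterior inclusion probabilities
# q(s_d = 1) learned by inference, rather than thresholding the (often
# unreliable) inverse lengthscales 1 / lengthscales directly.
```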

    Bayesian Graph Neural Networks for Molecular Property Prediction

    Graph neural networks for molecular property prediction are frequently underspecified by data and fail to generalise to new scaffolds at test time. A potential solution is Bayesian learning, which can capture our uncertainty in the model parameters. This study benchmarks a set of Bayesian methods applied to a directed MPNN, using the QM9 regression dataset. We find that capturing uncertainty in both readout and message passing parameters yields enhanced predictive accuracy, calibration, and performance on a downstream molecular search task.
    Comment: Presented at NeurIPS 2020 Machine Learning for Molecules workshop
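    As a rough sketch of one common approximate-Bayesian baseline in this kind of benchmark, the code below uses Monte Carlo dropout: dropout is kept active at prediction time and repeated stochastic forward passes yield a predictive mean and variance. The small feed-forward "readout" network and the placeholder molecule embeddings are stand-in assumptions; this is not the paper's directed MPNN or its specific set of Bayesian methods.

```python
# Monte Carlo dropout predictive uncertainty over a stand-in readout network.
import torch
import torch.nn as nn

class MCDropoutReadout(nn.Module):
    def __init__(self, in_dim=64, hidden=128, p=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def predict_with_uncertainty(model, x, n_samples=50):
    model.train()  # keep dropout stochastic at prediction time
    preds = torch.stack([model(x) for _ in range(n_samples)])  # (S, N, 1)
    return preds.mean(dim=0), preds.var(dim=0)

model = MCDropoutReadout()
mol_embeddings = torch.randn(8, 64)          # placeholder molecule embeddings
mean, var = predict_with_uncertainty(model, mol_embeddings)
print(mean.shape, var.shape)                 # torch.Size([8, 1]) twice
```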